Structured Sparsity and Convex Optimization
Abstract
The concept of parsimony is central in many scientific domains. In the context of statistics, signal processing or machine learning, it takes the form of variable or feature selection problems, and is commonly used in two situations: First, to make the model or the prediction more interpretable or cheaper to use, i.e., even if the underlying problem does not admit sparse solutions, one looks for the best sparse approximation. Second, sparsity can also be used given prior knowledge that the model should be sparse. In these two situations, reducing parsimony to finding models with low cardinality turns out to be limiting, and structured parsimony has emerged as a fruitful practical extension, with applications to image processing, text processing or bioinformatics. In this talk, I will review recent results on structured sparsity, as it applies to machine learning and signal processing.
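As a concrete illustration of the step from plain cardinality to structured penalties (standard formulations, sketched here for reference rather than taken from the talk itself), the non-convex cardinality penalty is usually replaced by its convex surrogate, the ℓ1 norm, and structured sparsity replaces the latter by a sum of norms over groups of variables:

    \Omega_0(w) = \|w\|_0, \qquad
    \Omega_1(w) = \|w\|_1 = \sum_{j=1}^{p} |w_j|, \qquad
    \Omega_{\mathcal{G}}(w) = \sum_{g \in \mathcal{G}} d_g \, \|w_g\|_2,

where \mathcal{G} is a collection of (possibly overlapping) groups of indices encoding the desired structure and d_g > 0 are weights; the choice of groups shapes the sparsity patterns the penalty favours.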
Similar Resources
Solving Structured Sparsity Regularization with Proximal Methods
Proximal methods have recently been shown to provide effective optimization procedures to solve the variational problems defining the ℓ1 regularization algorithms. The goal of the paper is twofold. First we discuss how proximal methods can be applied to solve a large class of machine learning algorithms which can be seen as extensions of ℓ1 regularization, namely structured sparsity regularizat...
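To make the connection concrete, here is a minimal sketch, in plain NumPy, of proximal gradient descent (ISTA) for ℓ1-regularized least squares; it illustrates the generic technique, not the implementation studied in the paper. The proximal operator of the ℓ1 norm is the componentwise soft-thresholding map.

    import numpy as np

    def soft_threshold(v, tau):
        # Proximal operator of tau * ||.||_1: componentwise soft-thresholding.
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def ista(A, b, lam, n_iter=500):
        # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)           # gradient of the least-squares term
            x = soft_threshold(x - grad / L, lam / L)
        return x

Each iteration alternates a gradient step on the smooth data-fit term with a proximal (shrinkage) step on the non-smooth penalty; structured penalties are handled by swapping in their own proximal operators.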
Structured Sparsity: Discrete and Convex approaches
Compressive sensing (CS) exploits sparsity to recover sparse or compressible signals from dimensionality-reducing, non-adaptive sensing mechanisms. Sparsity is also used to enhance interpretability in machine learning and statistics applications: while the ambient dimension is vast in modern data analysis problems, the relevant information therein typically resides in a much lower dimensional s...
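As one concrete instance of the convex route, a small sketch (my own illustration, not code from the paper) of the proximal operator of the non-overlapping group-lasso norm, which shrinks entire blocks of coefficients to zero at once and is a basic building block of many convex structured-sparsity solvers:

    import numpy as np

    def group_soft_threshold(v, groups, tau):
        # Proximal operator of tau * sum_g ||v_g||_2 for non-overlapping groups:
        # each block is scaled by max(0, 1 - tau/||v_g||_2), so weak blocks vanish entirely.
        x = v.copy()
        for g in groups:                       # groups: list of index arrays
            ng = np.linalg.norm(v[g])
            x[g] = 0.0 if ng <= tau else (1.0 - tau / ng) * v[g]
        return x

    # Example with three blocks of a length-6 vector; the weak middle block is zeroed out.
    v = np.array([3.0, -2.0, 0.2, -0.1, 4.0, 1.0])
    groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
    print(group_soft_threshold(v, groups, tau=0.5))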
Convex Optimization For Non-Convex Problems via Column Generation
We apply column generation to approximating complex structured objects via a set of primitive structured objects, under either the cross-entropy or L2 loss. We use L1 regularization to encourage the use of few primitive structured objects. We attack the approximation using convex optimization over an infinite number of variables, each corresponding to a primitive structured object, that are generated ...
Components Separation in Optical Imaging with Block-structured Sparse Model
We propose a source separation method for extracting components in highly perturbed optical imaging videos. We reconstruct the observed signal as the sum of linear representations of the components. Assuming sparsity and morphological diversity, the linear representation of each component by a well-designed operator contains only a few coefficients. We regularize the separation problem with blo...
How to exploit prior information in low-complexity models
Compressed Sensing refers to extracting a low-dimensional structured signal of interest from its incomplete random linear observations. A line of recent work has shown that, with extra prior information about the signal, one can recover the signal from far fewer observations. For this purpose, the general approach is to solve a weighted convex minimization problem. In such settings...
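To illustrate the weighted formulation in its simplest form (a hedged sketch under generic assumptions, not the paper's algorithm), the prior information enters as per-coordinate weights in the ℓ1 penalty, so coordinates believed a priori to lie in the support are penalized less; the corresponding proximal step is soft-thresholding with coordinate-dependent thresholds:

    import numpy as np

    def weighted_soft_threshold(v, weights, tau):
        # Proximal operator of tau * sum_j weights[j] * |x_j|: coordinates with
        # smaller weights (stronger prior of being nonzero) are shrunk less.
        return np.sign(v) * np.maximum(np.abs(v) - tau * weights, 0.0)

    # Example: the last two coordinates are believed a priori to be in the support.
    v = np.array([0.8, -0.5, 1.2, -2.0])
    weights = np.array([1.0, 1.0, 0.2, 0.2])
    print(weighted_soft_threshold(v, weights, tau=0.6))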
Structured Sparsity Models for Multiparty Speech Recovery from Reverberant Recordings
We tackle the multi-party speech recovery problem by modeling the acoustics of the reverberant chambers. Our approach exploits structured sparsity models to perform room modeling and speech recovery. We propose a scheme for characterizing the room acoustics from the unknown competing speech sources, relying on localization of the early images of the speakers by sparse approximation of the spa...